
    Forecasting using Bayesian and Information Theoretic Model Averaging: An Application to UK Inflation

    In recent years there has been increasing interest in forecasting methods that utilise large datasets, driven partly by the recognition that policymaking institutions need to process large quantities of information. Factor analysis is one popular way of doing this. Forecast combination is another, and it is on this that we concentrate. Bayesian model averaging methods have been widely advocated in this area, but a neglected frequentist approach is to use information theoretic based weights. We consider the use of model averaging in forecasting UK inflation with a large dataset from this perspective. We find that an information theoretic model averaging scheme can be a powerful alternative both to the more widely used Bayesian model averaging scheme and to factor models.
    Keywords: Forecasting, Inflation, Bayesian model averaging, Akaike criteria, Forecast combining
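    The information theoretic weighting the abstract refers to is commonly implemented with Akaike weights: each candidate model's AIC is converted into a relative likelihood and the forecasts are combined using the normalised weights. The sketch below is a minimal illustration of that idea, not the paper's actual estimation code; the function names and the toy inputs are this editor's assumptions.

```python
import math

def aic_weights(log_likelihoods, n_params):
    """Akaike weights for a set of candidate forecasting models.

    AIC_i = 2*k_i - 2*logL_i; each weight is proportional to
    exp(-0.5 * (AIC_i - min AIC)), normalised to sum to one.
    """
    aics = [2 * k - 2 * ll for ll, k in zip(log_likelihoods, n_params)]
    best = min(aics)
    raw = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

def combine_forecasts(forecasts, weights):
    """Weighted average of the individual model forecasts."""
    return sum(f * w for f, w in zip(forecasts, weights))

# Two hypothetical inflation models: same size, model 0 fits better,
# so it receives the larger weight in the combined forecast.
w = aic_weights([-100.0, -102.0], [3, 3])
combined = combine_forecasts([2.1, 2.6], w)
```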

    Are more data always better for factor analysis? Results for the euro area, the six largest euro area countries and the UK

    Factor based forecasting has been at the forefront of developments in the macroeconometric forecasting literature in the recent past. Despite the flurry of activity in the area, a number of specification issues such as the choice of the number of factors in the forecasting regression, the benefits of combining factor-based forecasts and the choice of the dataset from which to extract the factors remain partly unaddressed. This paper provides a comprehensive empirical investigation of these issues using data for the euro area, the six largest euro area countries, and the UK.
    JEL Classification: C100, C150, C530
    Keywords: Factors, Forecast Combinations, Large Datasets

    A State Space Approach to Extracting the Signal from Uncertain Data

    Most macroeconomic data are uncertain - they are estimates rather than perfect measures of underlying economic variables. One symptom of that uncertainty is the propensity of statistical agencies to revise their estimates in the light of new information or methodological advances. This paper sets out an approach for extracting the signal from uncertain data. It describes a two-step estimation procedure in which the history of past revisions is first used to estimate the parameters of a measurement equation describing the official published estimates. These parameters are then imposed in a maximum likelihood estimation of a state space model for the macroeconomic variable.
    Keywords: Real-time data analysis, State space models, Data uncertainty, Data revisions
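    The two-step procedure can be sketched in its simplest form: first gauge the measurement-noise variance from the history of revisions, then impose that variance when filtering the published series through a local-level state space model. This is a stylised, univariate illustration under assumptions of this editor's choosing (a random-walk signal and i.i.d. revisions), not the paper's actual specification.

```python
def revision_variance(early, final):
    """Step 1: estimate measurement-noise variance from past revisions,
    taken here as final-vintage minus early-vintage estimates."""
    revisions = [f - e for e, f in zip(early, final)]
    mean = sum(revisions) / len(revisions)
    return sum((r - mean) ** 2 for r in revisions) / (len(revisions) - 1)

def kalman_filter(observations, meas_var, state_var, x0=0.0, p0=1.0):
    """Step 2: filter the published estimates through a local-level
    model, with the measurement variance fixed at the value
    estimated from revisions (rather than estimated jointly)."""
    x, p = x0, p0
    filtered = []
    for y in observations:
        p = p + state_var        # predict: signal follows a random walk
        k = p / (p + meas_var)   # Kalman gain
        x = x + k * (y - x)      # update with the noisy observation
        p = (1 - k) * p
        filtered.append(x)
    return filtered
```

    In the full procedure the remaining parameters (here, the signal variance) would be estimated by maximum likelihood with the measurement parameters held fixed.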

    Digitalisation, institutions and governance, and diffusion: Mechanisms and evidence

    Digitalisation can be described as a sequence of technology and supply shocks which affect the economy through employment and labour markets, productivity and output, and competition and market structure. This paper focuses on how digitalisation, the process of diffusion of digital technologies, is affected by institutions and governance. It discusses a number of theoretical mechanisms and empirical evidence for different sets of European and other countries. The results indicate that a higher quality of institutions is usually associated with both a greater speed of diffusion and a greater spread of digital technologies. The results also suggest that there are large, policy-relevant differences in the diffusion process depending on a country's level of development and state of technological change.

    Sectoral specialisation in the EU: a macroeconomic perspective

    This paper analyses trends in sectoral specialisation in the EU and concludes the following: 1) The European production structure appears more homogenous than that of the US. 2) While sectoral specialisation has shown a slight increase in some smaller euro area countries towards the end-1990s, it is too early to detect any potential impact of EMU. 3) Despite some changes in sectoral composition, the business cycles of euro area countries became more synchronised over the 1990s, which may be seen as reassuring from the point of view of the single monetary policy. 4) Sectoral re-allocation accounts for as much as 50% of the increase in labour productivity growth in business sector services in the euro area. 5) The slowdown of European labour productivity growth relative to the US since the mid-1990s is explained by a stronger performance in the US wholesale and retail trade, financial intermediation and high-tech manufacturing sectors.

    What caused the 2000/01 slowdown? Results from a VAR analysis of G7 GDP components

    This paper presents a VAR-based analysis of shocks to G7 GDP components during the 2000/01 slowdown. It documents the patterns of shocks across the components and across the G7 countries, and provides measures of their persistence. It also considers the shocks during the preceding expansion, uses them to discuss possible business cycle asymmetries, and compares them with the pattern of shocks during the previous slowdown in 1990. The analysis is then extended to derive shocks to components that explicitly take into account the roles played by monetary policy and oil prices in 2000/01.
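    In a VAR framework, "shocks" are the residuals of the estimated autoregression: the part of each component's movement not predicted by its own and the other variables' past values. A minimal univariate analogue, an AR(1) fitted by OLS, illustrates the idea; the actual paper uses a multivariate system, and this single-equation sketch is this editor's simplification.

```python
def ar1_shocks(series):
    """Fit y_t = a + b*y_{t-1} by OLS and return the residuals,
    interpreted as the shocks to the series."""
    y = series[1:]
    x = series[:-1]
    n = len(y)
    mx = sum(x) / n
    my = sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]
```

    A series that follows its autoregression exactly produces shocks of zero; in the paper's application, the size and sign of the 2000/01 residuals are what identify the slowdown's shocks.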